The mongodb npm package is the official Node.js driver for MongoDB. It provides a high-level API to connect to and interact with MongoDB databases. With this package, developers can perform CRUD operations, manage database connections, and work with MongoDB features like transactions, indexes, and aggregation.
Connecting to a MongoDB database
This code sample demonstrates how to connect to a MongoDB database using the MongoClient object provided by the mongodb package.
const { MongoClient } = require('mongodb');

const url = 'mongodb://localhost:27017';
const client = new MongoClient(url);

async function connect() {
  try {
    await client.connect();
    console.log('Connected to MongoDB');
  } catch (e) {
    console.error(e);
  }
}

connect();
CRUD Operations
This code sample shows how to perform CRUD (Create, Read, Update, Delete) operations on a MongoDB collection using the mongodb package.
const { MongoClient } = require('mongodb');

const url = 'mongodb://localhost:27017';
const client = new MongoClient(url);
const dbName = 'myDatabase';

async function crudOperations() {
  try {
    await client.connect();
    const db = client.db(dbName);
    const collection = db.collection('documents');

    // Create a document
    await collection.insertOne({ a: 1 });

    // Read documents
    const docs = await collection.find({}).toArray();

    // Update a document
    await collection.updateOne({ a: 1 }, { $set: { b: 1 } });

    // Delete a document
    await collection.deleteOne({ b: 1 });
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}

crudOperations();
Index Management
This code sample illustrates how to manage indexes in a MongoDB collection, including creating an index and listing all indexes.
const { MongoClient } = require('mongodb');

const url = 'mongodb://localhost:27017';
const client = new MongoClient(url);
const dbName = 'myDatabase';

async function manageIndexes() {
  try {
    await client.connect();
    const db = client.db(dbName);
    const collection = db.collection('documents');

    // Create an index
    await collection.createIndex({ a: 1 });

    // List indexes
    const indexes = await collection.indexes();
    console.log(indexes);
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}

manageIndexes();
Aggregation
This code sample demonstrates how to use the aggregation framework provided by MongoDB to process data and compute aggregated results.
const { MongoClient } = require('mongodb');

const url = 'mongodb://localhost:27017';
const client = new MongoClient(url);
const dbName = 'myDatabase';

async function aggregateData() {
  try {
    await client.connect();
    const db = client.db(dbName);
    const collection = db.collection('documents');

    // Perform an aggregation query
    const aggregation = await collection.aggregate([
      { $match: { a: 1 } },
      { $group: { _id: '$b', total: { $sum: 1 } } }
    ]).toArray();
    console.log(aggregation);
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}

aggregateData();
Mongoose is an Object Data Modeling (ODM) library for MongoDB and Node.js. It manages relationships between data, provides schema validation, and is used to translate between objects in code and the representation of those objects in MongoDB. Mongoose offers a more structured approach to data handling with predefined schemas compared to the flexibility of the mongodb package.
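As a rough illustration of that structured approach, here is a minimal Mongoose sketch; the schema fields, model name, and database name are placeholders and are not part of the mongodb driver's API.

const mongoose = require('mongoose');

// A schema gives documents a predefined shape and built-in validation
const userSchema = new mongoose.Schema({
  name: { type: String, required: true },
  age: Number
});
const User = mongoose.model('User', userSchema);

async function run() {
  await mongoose.connect('mongodb://localhost:27017/myDatabase');
  const user = await User.create({ name: 'Ada', age: 36 }); // validated against the schema
  console.log(user._id);
  await mongoose.disconnect();
}

run().catch(console.error);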
Couchbase is the official Node.js client library for the Couchbase database. While Couchbase is a different NoSQL database system with its own set of features and capabilities, the couchbase npm package offers similar functionalities in terms of CRUD operations, connection management, and querying as the mongodb package does for MongoDB.
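For comparison, a minimal sketch of the couchbase package's key/value CRUD calls (the SDK 3.x API is assumed; the connection string, credentials, and bucket name are placeholders):

const couchbase = require('couchbase');

async function run() {
  const cluster = await couchbase.connect('couchbase://localhost', {
    username: 'Administrator',
    password: 'password'
  });
  const collection = cluster.bucket('myBucket').defaultCollection();

  await collection.upsert('user::1', { name: 'Ada' }); // create or update
  const result = await collection.get('user::1');      // read
  console.log(result.content);
  await collection.remove('user::1');                  // delete
}

run().catch(console.error);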
Redis is an in-memory data structure store, used as a database, cache, and message broker. The npm package for Redis provides Node.js bindings to the Redis server. It is similar to mongodb in that it allows for data storage and retrieval, but it operates in-memory and is typically used for different use cases such as caching.
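A minimal sketch of typical cache-style usage with the redis package (the v4 promise-based API is assumed; the key name and expiry are illustrative):

const { createClient } = require('redis');

async function run() {
  const client = createClient({ url: 'redis://localhost:6379' });
  client.on('error', (err) => console.error('Redis error', err));
  await client.connect();

  // Store a value with a 60-second expiry, then read it back
  await client.set('greeting', 'hello', { EX: 60 });
  const value = await client.get('greeting');
  console.log(value);

  await client.quit();
}

run().catch(console.error);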
| what | where |
|---|---|
| documentation | http://mongodb.github.io/node-mongodb-native/ |
| apidoc | http://mongodb.github.io/node-mongodb-native/ |
| source | https://github.com/mongodb/node-mongodb-native |
| mongodb | http://www.mongodb.org/ |
Think you’ve found a bug? Want to see a new feature in node-mongodb-native? Please open a case in our issue management tool, JIRA:
Bug reports in JIRA for all driver projects (i.e. NODE, PYTHON, CSHARP, JAVA) and the Core Server (i.e. SERVER) project are public.
http://jira.mongodb.org/browse/NODE
To install the most recent release from npm, run:
npm install mongodb
That may give you a warning that bugs['web'] should be bugs['url']; it is safe to ignore (this has been fixed in the development version).
To install the latest from the repository, run:
npm install path/to/node-mongodb-native
This is a node.js driver for MongoDB. It's a port (or close to a port) of the Ruby library at http://github.com/mongodb/mongo-ruby-driver/.
A simple example of inserting a document.
var MongoClient = require('mongodb').MongoClient
  , format = require('util').format;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  var collection = db.collection('test_insert');
  collection.insert({a:2}, function(err, docs) {

    collection.count(function(err, count) {
      console.log(format("count = %s", count));
    });

    // Locate all the entries using find
    collection.find().toArray(function(err, results) {
      console.dir(results);
      // Let's close the db
      db.close();
    });
  });
})
This section covers storing and retrieving the non-JSON MongoDB primitives (ObjectID, Long, Binary, Timestamp, DBRef, Code).

In particular, every document has a unique _id, which can be almost any type; by default a 12-byte ObjectID is created. ObjectIDs can be represented as 24-digit hexadecimal strings, but you must convert the string back into an ObjectID before you can use it in the database. For example:
// Get the objectID type
var ObjectID = require('mongodb').ObjectID;
var idString = '4e4e1638c85e808431000003';
collection.findOne({_id: new ObjectID(idString)}, console.log) // ok
collection.findOne({_id: idString}, console.log) // wrong! callback gets undefined
Here are the constructors for the non-JavaScript BSON primitive types:
// Fetch the library
var mongo = require('mongodb');
// Create new instances of BSON types
new mongo.Long(numberString)
new mongo.ObjectID(hexString)
new mongo.Timestamp() // the actual unique number is generated on insert.
new mongo.DBRef(collectionName, id, dbName)
new mongo.Binary(buffer) // takes a string or Buffer
new mongo.Code(code, [context])
new mongo.Symbol(string)
new mongo.MinKey()
new mongo.MaxKey()
new mongo.Double(number) // Force double storage
If you are running a version of this library that has the C/C++ BSON parser compiled, you can tell the driver to use it by passing the option native_parser:true as below:
// using native_parser:
MongoClient.connect('mongodb://127.0.0.1:27017/test', {db: {native_parser: true}}, function(err, db) {})
The C++ parser uses the same JS objects for both serialization and deserialization.
The source code is available at http://github.com/mongodb/node-mongodb-native. You can either clone the repository or download a tarball of the latest release.
Once you have the source you can test the driver by running
$ node test/runner.js -t functional
in the main directory. You will need to have a mongo instance running on localhost for the integration tests to pass.
For examples look in the examples/ directory. You can execute the examples using node.
$ cd examples
$ node queries.js
The GridStore class allows for storage of binary files in MongoDB using the MongoDB-defined files and chunks collections.
For more information, have a look at GridStore.
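As a rough sketch of what that looks like with the legacy GridStore API (the file name and contents are placeholders; newer driver versions replace GridStore with GridFSBucket):

var MongoClient = require('mongodb').MongoClient
  , GridStore = require('mongodb').GridStore;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  // Open a file for writing, write some data, then close it
  var gridStore = new GridStore(db, 'hello.txt', 'w');
  gridStore.open(function(err, gridStore) {
    gridStore.write('hello, gridfs!', function(err, gridStore) {
      gridStore.close(function(err, result) {
        // Read the whole file back in one call
        GridStore.read(db, 'hello.txt', function(err, data) {
          console.log(data.toString());
          db.close();
        });
      });
    });
  });
});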
For more information about how to connect to a replica set, have a look at the documentation.
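A minimal sketch of connecting to a replica set through the connection string (the host names and replica set name are placeholders):

var MongoClient = require('mongodb').MongoClient;

MongoClient.connect(
  'mongodb://host1:27017,host2:27017,host3:27017/test?replicaSet=myReplSet',
  function(err, db) {
    if(err) throw err;
    console.log('connected to the replica set');
    db.close();
  });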
Defining your own primary key factory allows you to generate your own series of ids (for example, to use something like ISBN numbers). The generated id needs to be a 12-byte-long "string".
A simple example is below:
var MongoClient = require('mongodb').MongoClient
  , ObjectID = require('mongodb').ObjectID
  , format = require('util').format;

// Custom factory (need to provide a 12 byte array)
CustomPKFactory = function() {}
CustomPKFactory.prototype = new Object();
CustomPKFactory.createPk = function() {
  return new ObjectID("aaaaaaaaaaaa");
}

MongoClient.connect('mongodb://127.0.0.1:27017/test', {'pkFactory':CustomPKFactory}, function(err, db) {
  if(err) throw err;

  db.dropDatabase(function(err, done) {
    db.createCollection('test_custom_key', function(err, collection) {
      collection.insert({'a':1}, function(err, docs) {
        collection.find({'_id':new ObjectID("aaaaaaaaaaaa")}).toArray(function(err, items) {
          console.dir(items);
          // Let's close the db
          db.close();
        });
      });
    });
  });
});
If this document doesn't answer your questions, see the source of Collection or Cursor, or the documentation at MongoDB for query and update formats.
The find method is actually a factory method to create Cursor objects. A Cursor lazily uses the connection the first time you call nextObject, each, or toArray.

The basic operation on a cursor is the nextObject method, which fetches the next matching document from the database. The convenience methods each and toArray call nextObject until the cursor is exhausted.
Signatures:
var cursor = collection.find(query, [fields], options);
cursor.sort(fields).limit(n).skip(m);
cursor.nextObject(function(err, doc) {});
cursor.each(function(err, doc) {});
cursor.toArray(function(err, docs) {});
cursor.rewind() // reset the cursor to its initial state.
Useful chainable methods of cursor. These can optionally be options of find instead of method calls (see the second example below):

* .limit(n).skip(m) to control paging.
* .sort(fields) Order by the given fields. There are several equivalent syntaxes:
  * .sort({field1: -1, field2: 1}) descending by field1, then ascending by field2.
  * .sort([['field1', 'desc'], ['field2', 'asc']]) same as above.
  * .sort([['field1', 'desc'], 'field2']) same as above.
  * .sort('field1') ascending by field1.

Other options of find:

* fields the fields to fetch (to avoid transferring the entire document).
* tailable if true, makes the cursor tailable.
* batchSize the number of results to request from the database on each fetch. This should initially be greater than 1, otherwise the database will automatically close the cursor. The batch size can be set to 1 with batchSize(n, function(err){}) after performing the initial query to the database.
* hint see Optimization: hint.
* explain turns this into an explain query. You can also call explain() on any cursor to fetch the explanation.
* snapshot prevents documents that are updated while the query is active from being returned multiple times. See more details about query snapshots.
* timeout if false, asks MongoDB not to time out this cursor after an inactivity period.

For information on how to create queries, see the MongoDB section on querying.
var MongoClient = require('mongodb').MongoClient
  , format = require('util').format;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  var collection = db
    .collection('test')
    .find({})
    .limit(10)
    .toArray(function(err, docs) {
      console.dir(docs);
    });
});
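The same query can also be written by passing the chainable methods as options to find, as mentioned in the list above; a sketch (the option names follow that list, and the field selection is illustrative):

var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  db.collection('test')
    .find({}, {limit: 10, sort: [['_id', 'asc']], fields: {a: 1}})
    .toArray(function(err, docs) {
      console.dir(docs);
      db.close();
    });
});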
Signature:
collection.insert(docs, options, [callback]);
where docs can be a single document or an array of documents.

Useful options:

* w:1 Should always be set if you have a callback.

See also: MongoDB docs for insert.
var MongoClient = require('mongodb').MongoClient
  , format = require('util').format;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  db.collection('test').insert({hello: 'world'}, {w:1}, function(err, objects) {
    if (err) console.warn(err.message);
    if (err && err.message.indexOf('E11000 ') !== -1) {
      // this _id was already inserted in the database
    }
  });
});
Note that there's no reason to pass a callback to the insert or update commands unless you use the w:1 option. If you don't specify w:1, then your callback will be called immediately.
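A small sketch contrasting the two cases (the collection and document contents are illustrative):

var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  // Fire-and-forget: no write concern, so a callback would fire immediately anyway
  db.collection('test').insert({fast: true});

  // Acknowledged write: with w:1 the callback reflects the server's response
  db.collection('test').insert({safe: true}, {w:1}, function(err, result) {
    if (err) console.warn(err.message);
    db.close();
  });
});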
The update operation will update the first document that matches your query (or all documents that match if you use multi:true). If w:1 is set, upsert is not set, and no documents match, your callback will return 0 documents updated.

See the MongoDB docs for the modifier ($inc, $set, $push, etc.) formats.
Signature:
collection.update(criteria, objNew, options, [callback]);
Useful options:

* w:1 Should always be set if you have a callback.
* multi:true If set, all matching documents are updated, not just the first.
* upsert:true Atomically inserts the document if no documents matched.

Example for update:
var MongoClient = require('mongodb').MongoClient
  , format = require('util').format;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  db.collection('test').update({hi: 'here'}, {$set: {hi: 'there'}}, {w:1}, function(err) {
    if (err) console.warn(err.message);
    else console.log('successfully updated');
  });
});
findAndModify is like update, but it also gives the updated document to your callback. There are, however, a few key differences between findAndModify and update.
Signature:
collection.findAndModify(query, sort, update, options, callback)
The sort parameter is used to specify which object to operate on, if more than one document matches. It takes the same format as the cursor sort (see Connection.find above).
See the MongoDB docs for findAndModify for more details.
Useful options:

* remove:true set to true to remove the object before returning.
* new:true set to true if you want to return the modified object rather than the original. Ignored for remove.
* upsert:true Atomically inserts the document if no documents matched.

Example for findAndModify:
var MongoClient = require('mongodb').MongoClient
  , format = require('util').format;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;

  db.collection('test').findAndModify({hello: 'world'}, [['_id','asc']], {$set: {hi: 'there'}}, {}, function(err, object) {
    if (err) console.warn(err.message);
    else console.dir(object); // undefined if no matching object exists.
  });
});
The save method is a shorthand for an upsert if the document contains an _id, or an insert if there is no _id.
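A short sketch of save in both modes (the database and collection names are placeholders):

var MongoClient = require('mongodb').MongoClient
  , ObjectID = require('mongodb').ObjectID;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function(err, db) {
  if(err) throw err;
  var collection = db.collection('test');

  // No _id: behaves like an insert
  collection.save({hello: 'world'}, {w:1}, function(err, result) {
    if (err) console.warn(err.message);

    // With an _id: behaves like an upsert on that _id
    collection.save({_id: new ObjectID(), hello: 'again'}, {w:1}, function(err, result) {
      if (err) console.warn(err.message);
      db.close();
    });
  });
});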
See HISTORY
Aaron Heckmann, Christoph Pojer, Pau Ramon Revilla, Nathan White, Emmerman, Seth LaForge, Boris Filipov, Stefan Schärmeli, Tedde Lundgren, renctan, Sergey Ukustov, Ciaran Jessup, kuno, srimonti, Erik Abele, Pratik Daga, Slobodan Utvic, Kristina Chodorow, Yonathan Randolph, Brian Noguchi, Sam Epstein, James Harrison Fisher, Vladimir Dronnikov, Ben Hockey, Henrik Johansson, Simon Weare, Alex Gorbatchev, Shimon Doodkin, Kyle Mueller, Eran Hammer-Lahav, Marcin Ciszak, François de Metz, Vinay Pulim, nstielau, Adam Wiggins, entrinzikyl, Jeremy Selier, Ian Millington, Public Keating, andrewjstone, Christopher Stott, Corey Jewett, brettkiefer, Rob Holland, Senmiao Liu, heroic, gitfy
Copyright 2009 - 2013 MongoDB Inc.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
FAQs
The official MongoDB driver for Node.js
The npm package mongodb receives a total of 5,345,302 weekly downloads. As such, mongodb's popularity was classified as popular.
We found that mongodb demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 8 open source maintainers collaborating on the project.